
    Synergizing Roadway Infrastructure Investment with Digital Infrastructure for Infrastructure-Based Connected Vehicle Applications: Review of Current Status and Future Directions

    The safety, mobility, environmental and economic benefits of Connected and Autonomous Vehicles (CAVs) are potentially dramatic. However, realization of these benefits largely hinges on the timely upgrading of the existing transportation system. CAVs must be enabled to send and receive data to and from other vehicles and drivers (V2V communication) and to and from infrastructure (V2I communication). Further, infrastructure and the transportation agencies that manage it must be able to collect, process, distribute and archive these data quickly, reliably, and securely. This paper focuses on current digital roadway infrastructure initiatives and highlights the importance of including digital infrastructure investment alongside more traditional infrastructure investment to keep up with the auto industry's push towards real-time communication and data processing capability. Agencies responsible for transportation infrastructure construction and management must collaborate, establishing national and international platforms to guide the planning, deployment and management of digital infrastructure in their jurisdictions. This will help create standardized, interoperable national and international systems so that CAV technology is not deployed in a haphazard and uncoordinated manner.

    Bringing a CURE into a Discrete Mathematics Course and Beyond

    Course-based Undergraduate Research Experiences (CUREs) have been well developed in the hard sciences, but math CUREs are all but absent from the literature. Like biology and chemistry, math programs suffer from a lack of research experiences, and many students are not able to participate in programs like REUs (Research Experiences for Undergraduates). CUREs are a great alternative, but the current definition of a CURE (see [1]) presents potential barriers when applied to mathematics (e.g., time, novelty of the project). Our solution to these barriers was to develop a math CURE pathway in which students complete math CUREs in targeted courses. After finishing the pathway (or part of the pathway), students complete a research project in at least one of the following areas: Lie theory, representation theory, or combinatorics. The focus of this paper is the math CURE implemented in a discrete mathematics course for math and computer science majors. We share our experiences with the development and implementation of this CURE over several iterations, as well as the impact of the CURE on students' experiences, drawing on participant survey data obtained from this CURE.

    Development and Performance Evaluation of a Connected Vehicle Application Development Platform (CVDeP)

    Connected vehicle (CV) application developers need a development platform to build, test and debug real-world CV applications, such as safety, mobility, and environmental applications, in edge-centric cyber-physical systems. Our study objective is to develop and evaluate a scalable and secure CV application development platform (CVDeP) that enables application developers to build, test and debug CV applications in real time. CVDeP ensures that the CV applications built on it meet the functional requirements imposed by those specific applications. We evaluated the efficacy of CVDeP using two CV applications (one safety and one mobility application) and validated them through a field experiment at the Clemson University Connected Vehicle Testbed (CU-CVT). The analyses demonstrate the efficacy of CVDeP, which satisfies the functional requirements (i.e., latency and throughput) of a CV application while maintaining the scalability and security of the platform and applications.
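
    To make the latency and throughput criteria concrete, the following minimal Python sketch measures both for a stream of CV-style messages sent over a local UDP socket. It is an illustration only, not part of CVDeP; the message fields, host/port and message count are assumptions made for the example.

    import json
    import socket
    import time

    HOST, PORT = "127.0.0.1", 50007   # assumed local endpoints for this sketch
    N_MESSAGES = 1000

    # Receiver socket, standing in for an edge-side CV application endpoint.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind((HOST, PORT))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    latencies = []
    start = time.perf_counter()
    for seq in range(N_MESSAGES):
        # Illustrative basic-safety-message-like payload (fields are assumptions).
        msg = {"seq": seq, "speed_mps": 13.4, "lat": 34.68, "lon": -82.84,
               "sent_at": time.perf_counter()}
        tx.sendto(json.dumps(msg).encode(), (HOST, PORT))
        data, _ = rx.recvfrom(4096)
        latencies.append(time.perf_counter() - json.loads(data)["sent_at"])
    elapsed = time.perf_counter() - start

    print(f"mean latency: {1e3 * sum(latencies) / len(latencies):.3f} ms")
    print(f"throughput:   {N_MESSAGES / elapsed:.0f} messages/s")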

    Consistent Online Backup in Transactional File Systems

    The backup taken of a file system must be consistent, preserving data integrity across the files in the file system. With file system sizes getting very large, and with demand for continuous access to data, backups have to be taken while the file system is active (i.e., online). An arbitrarily taken online backup may result in an inconsistent backup copy. We propose a scheme, referred to as mutual serializability, for taking a consistent backup of an active file system, assuming that the file system supports transactions. The scheme extends the set of conflicting operations to include read-read conflicts, and we show that if the backup transaction is mutually serializable with every other transaction individually, a consistent backup copy is obtained. The user transactions continue to serialize among themselves using a standard concurrency control protocol such as Strict 2PL. We put our scheme into a formal framework to prove its correctness, and the formalization as well as the correctness proof are independent of the concurrency control protocol used to serialize user transactions. The scheme has been implemented, and experiments show that consistent online backup is possible with reasonable overhead.
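
    The short Python sketch below illustrates the pairwise check that mutual serializability implies: under an extended conflict relation in which even read-read pairs conflict, all conflicts between the backup transaction and a given user transaction must point in the same direction. The schedule representation (a list of (transaction, operation, item) tuples) is an assumption made for this example, not the paper's formal framework.

    def conflicts(op_a, op_b):
        """Extended conflict relation: any two operations on the same item
        conflict, including read-read pairs (the key extension described above)."""
        return op_a[2] == op_b[2]

    def mutually_serializable(schedule, backup_txn, user_txn):
        """True if all conflicts between the two transactions in this interleaved
        schedule point in one direction, i.e. the pair is equivalent to some
        serial order of the two transactions."""
        backup_first = user_first = False
        ops = [e for e in schedule if e[0] in (backup_txn, user_txn)]
        for i, a in enumerate(ops):
            for b in ops[i + 1:]:
                if a[0] != b[0] and conflicts(a, b):
                    if a[0] == backup_txn:
                        backup_first = True
                    else:
                        user_first = True
        return not (backup_first and user_first)

    # Example: the backup reads f1 both before and after T1 writes it, so the
    # conflicts point in both directions and the pair is not mutually serializable.
    schedule = [("B", "read", "f1"), ("T1", "write", "f1"), ("B", "read", "f1")]
    print(mutually_serializable(schedule, "B", "T1"))   # False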

    Artificial neural network to determine dynamic effect in capillary pressure relationship for two-phase flow in porous media with micro-heterogeneities

    An artificial neural network (ANN) is presented for computing a parameter of dynamic two-phase flow in porous media with water as the wetting phase, namely the dynamic coefficient (τ), by considering micro-heterogeneity in porous media as a key parameter. τ quantifies the dependence of the time derivative of water saturation on the capillary pressures and indicates the rate at which a two-phase flow system may reach flow equilibrium. Therefore, τ is of importance in the study of dynamic two-phase flow in porous media. An attempt has been made in this work to reduce computational and experimental effort by developing and applying an ANN which can predict the dynamic coefficient by "learning" from available data. The data employed for testing and training the ANN have been obtained from computational flow-physics-based studies. Six input parameters have been used for the training, performance testing and validation of the ANN: water saturation, intensity of heterogeneity, average permeability depending on this intensity, fluid density ratio, fluid viscosity ratio and temperature. It is found that a 15-neuron, single-hidden-layer ANN can characterize the relationship between media heterogeneity and the dynamic coefficient, and that it ensures a reliable prediction of the dynamic coefficient as a function of water saturation.
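
    As a rough illustration of the network architecture described above (six inputs, one hidden layer of 15 neurons, one output for τ), the sketch below uses scikit-learn's MLPRegressor. The training data here are random placeholders standing in for the flow-physics simulation results, so the fitted values carry no physical meaning.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Six inputs: water saturation, heterogeneity intensity, average permeability,
    # density ratio, viscosity ratio, temperature (placeholder ranges only).
    X = rng.uniform(size=(500, 6))
    tau = rng.uniform(size=500)              # placeholder dynamic coefficient values

    scaler = StandardScaler().fit(X)
    model = MLPRegressor(hidden_layer_sizes=(15,), activation="tanh",
                         max_iter=5000, random_state=0)
    model.fit(scaler.transform(X), tau)

    x_new = rng.uniform(size=(1, 6))         # one new parameter combination
    print("predicted tau:", model.predict(scaler.transform(x_new))[0])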

    Real-time motion planning methods for autonomous on-road driving: State-of-the-art and future research directions

    Autonomous, or self-driving, vehicles are currently at the heart of academic and industry research because of their multi-faceted advantages, which include improved safety, reduced congestion, lower emissions and greater mobility. Software is the key driving factor underpinning autonomy, within which planning algorithms responsible for mission-critical decision making hold a significant position. While transporting passengers or goods from a given origin to a given destination, motion planning methods incorporate searching for a path to follow, avoiding obstacles and generating the best trajectory that ensures safety, comfort and efficiency. A range of different planning approaches has been proposed in the literature. The purpose of this paper is to review existing approaches and then compare and contrast the different methods employed for the motion planning of autonomous on-road driving, which consists of (1) finding a path, (2) searching for the safest manoeuvre and (3) determining the most feasible trajectory. Methods developed by researchers at each of these three levels exhibit varying levels of complexity and performance accuracy. This paper presents a critical evaluation of each of these methods, in terms of their advantages/disadvantages, inherent limitations, feasibility, optimality, handling of obstacles and testing operational environments. Based on a critical review of existing methods, research challenges to address current limitations are identified and future research directions are suggested so as to enhance the performance of planning algorithms at all three levels. Some promising areas of future focus are the use of vehicular communications (V2V and V2I) and the incorporation of transport engineering aspects in order to improve the look-ahead horizon of current sensing technologies that are essential for planning, with the aim of reducing the total cost of driverless vehicles. This critical review of planning techniques, along with the associated discussion of their constraints and limitations, seeks to assist researchers in accelerating development in the emerging field of autonomous vehicle research.
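
    As a generic illustration of the first of the three planning levels discussed above (finding a collision-free path), the sketch below runs a textbook A* search on a small occupancy grid; it is not an algorithm proposed or evaluated in the review.

    import heapq

    def astar(grid, start, goal):
        """Return a list of (row, col) cells from start to goal, or None if no path."""
        rows, cols = len(grid), len(grid[0])
        h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan heuristic
        open_set = [(h(start, goal), 0, start, [start])]
        best_cost = {start: 0}
        while open_set:
            _, cost, cell, path = heapq.heappop(open_set)
            if cell == goal:
                return path
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nxt = (cell[0] + dr, cell[1] + dc)
                if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                        and grid[nxt[0]][nxt[1]] == 0):
                    new_cost = cost + 1
                    if new_cost < best_cost.get(nxt, float("inf")):
                        best_cost[nxt] = new_cost
                        heapq.heappush(open_set, (new_cost + h(nxt, goal),
                                                  new_cost, nxt, path + [nxt]))
        return None

    occupancy = [[0, 0, 0, 0],
                 [1, 1, 0, 1],   # 1 marks an obstacle cell
                 [0, 0, 0, 0]]
    print(astar(occupancy, (0, 0), (2, 0)))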

    Road pollution estimation using static cameras and neural networks

    This article presents a methodology for estimating road pollution by analysing traffic video sequences. The aim is to take advantage of the large network of IP cameras already deployed across the road system of any state or country to estimate the pollution in each area. The proposal uses deep learning neural networks for object detection, together with a pollution estimation model based on vehicle frequency and speed. Experiments show promising results suggesting that the system can be used on its own or in combination with existing systems to measure road pollution. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
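
    The estimation step described above can be sketched as follows: given per-interval vehicle detections (stubbed here in place of the deep-learning detector), an emission index for a road segment is computed from vehicle frequency and speed. The speed-emission curve and its coefficients are illustrative assumptions, not the model used in the paper.

    def emission_factor(speed_kmh):
        """Toy U-shaped per-vehicle emission factor (g/km): higher at very low
        (congested) and very high speeds, lowest around 70 km/h."""
        return 0.00005 * (speed_kmh - 70.0) ** 2 + 0.12

    def segment_emission_rate(detections, segment_km=1.0):
        """detections: (label, speed_kmh) tuples observed on the segment per hour."""
        return sum(emission_factor(speed) * segment_km for _, speed in detections)

    # Example: 300 vehicles/hour at mixed speeds on a 1 km segment
    # (in the real system this list would come from the object detector).
    detections = [("car", 30)] * 100 + [("car", 70)] * 150 + [("truck", 110)] * 50
    print(f"estimated emissions: {segment_emission_rate(detections):.1f} g/h")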

    Editorial

    Phenomenal advances in computer technology, together with progress in the areas of mathematical modelling in recent decades, have made simulation procedures very powerful tools for the analysis and design of almost all types of industrial and economic processes. Accurate and reliable predictions about the outcome of complex natural transport processes and the performance of novel designs for industrial equipment are routinely made using modern simulation methodologies. The annually held International Industrial Simulation Conference (ISC), organised and run by EUROSIS in conjunction with various organisations, provides an important forum for the presentation and exchange of new ideas related to the development and application of computer simulation techniques across a diverse and wide-ranging area of industrial relevance.

    Super-resolution of 3D MRI corrupted by heavy noise with the median filter transform

    The acquisition of 3D MRIs is adversely affected by many degrading factors, including low spatial resolution and noise. Image enhancement techniques are commonplace, but there are few proposals that address the increase of spatial resolution and noise removal at the same time. An algorithm to address this need is proposed in this work. The proposal tiles the 3D image space into parallelepipeds, and a median filter is applied within each parallelepiped. The results obtained from several such tilings are then combined by a subsequent median computation. The convergence properties of the proposed method are formally proved. Experimental results with both synthetic and real images demonstrate that our approach outperforms its competitors for images with high noise levels. Moreover, it is demonstrated that our algorithm does not generate any hallucinations. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
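
    A minimal NumPy sketch of the idea described above: the volume is tiled into blocks, each block is replaced by its median, the pass is repeated for several shifted tilings, and the per-tiling results are combined voxel-wise by a further median. The block size and offsets are illustrative choices, not the paper's exact configuration.

    import numpy as np

    def tiled_median_pass(volume, block=(2, 2, 2), offset=(0, 0, 0)):
        """Replace each block of one (shifted) tiling by the block's median value."""
        shifted = np.roll(volume, shift=tuple(-o for o in offset), axis=(0, 1, 2))
        bd, bh, bw = block
        d, h, w = (s - s % b for s, b in zip(shifted.shape, block))
        v = shifted[:d, :h, :w]                      # trim to whole blocks (sketch-level simplification)
        blocks = v.reshape(d // bd, bd, h // bh, bh, w // bw, bw)
        med = np.median(blocks, axis=(1, 3, 5))      # one median per block
        filled = np.repeat(np.repeat(np.repeat(med, bd, 0), bh, 1), bw, 2)
        out = shifted.astype(float)
        out[:d, :h, :w] = filled
        return np.roll(out, shift=offset, axis=(0, 1, 2))

    # Combine several shifted tilings with a voxel-wise median.
    rng = np.random.default_rng(0)
    noisy = rng.normal(loc=1.0, scale=0.5, size=(8, 8, 8))   # synthetic noisy volume
    offsets = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
    passes = np.stack([tiled_median_pass(noisy, offset=o) for o in offsets])
    denoised = np.median(passes, axis=0)
    print("std before/after:", noisy.std(), denoised.std())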

    Are Public Intrusion Datasets Fit for Purpose? Characterising the State of the Art in Intrusion Event Datasets

    In recent years cybersecurity attacks have caused major disruption and information loss for online organisations, with high-profile incidents in the news. One of the key challenges in advancing the state of the art in intrusion detection is the lack of representative datasets. These datasets typically contain millions of time-ordered events (e.g. network packet traces, flow summaries, log entries), subsequently analysed to identify abnormal behaviour and specific attacks [1]. Generating realistic datasets has historically required expensive networked assets, specialised traffic generators, and considerable design preparation. Even with advances in virtualisation it remains challenging to create and maintain a representative environment. Major improvements are needed in the design, quality and availability of datasets to assist researchers in developing advanced detection techniques. With the emergence of new technology paradigms, such as intelligent transport and autonomous vehicles, it is also likely that new classes of threat will emerge [2]. Given the rate of change in threat behaviour [3], datasets quickly become obsolete, and some of the most widely cited datasets date back over two decades. Older datasets have limited value: they are often heavily filtered and anonymised, with unrealistic event distributions and opaque design methodology. The relative scarcity of Intrusion Detection System (IDS) datasets is compounded by the lack of a central registry and inconsistent information on provenance. Researchers may also find it hard to locate datasets or understand their relative merits. In addition, many datasets rely on simulation, originating from academic or government institutions. The publication process itself often creates conflicts, with the need to de-identify sensitive information in order to meet regulations such as the General Data Protection Regulation (GDPR) [4]. A final issue for researchers is the lack of standardised metrics with which to compare dataset quality. In this paper we attempt to classify the most widely used public intrusion datasets, providing references to archives and associated literature. We illustrate their relative utility and scope, highlighting the threat composition, formats, special features, and associated limitations. We identify best practice in dataset design, and describe potential pitfalls of designing anomaly detection techniques based on data that may be either inappropriate or compromised due to unrealistic threat coverage. The contributions made in this paper are expected to facilitate continued research and development towards effectively combating the constantly evolving cyber threat landscape.